Tight Bounds for the VC-Dimension of Feedforward and Recurrent Networks of Piecewise Polynomial Activation Function Units

Author

  • Akito SAKURAI
Abstract

We consider the VC-dimension of the class of neural networks of depth s with w adjustable parameters and h hidden units whose activation functions are piecewise polynomials of at most q segments of degree at most d. When d ≥ 2 and q ≥ 2, the VC-dimension is O(ws(s log d + log(qh/s))), O(ws((h/s) log q + log d)), and Ω(ws log(dqh/s)). When d ≥ 2 and q = 1, it is Θ(ws log d). When d = 1 and q ≥ 2, it is Θ(ws log(qh/s)). When d = 0 and q > 2, it is O(ws log(qh/s)) and Ω(w log h). When d = 0 and q = 2, it is Θ(w log h). The recurrent networks we consider are feedforward networks equipped with feedback connections of unit time delay. Let r be the number of repetitions along the feedbacks. Then the VC-dimension is O(wrs(log r + s log d + log(qh/s))), O(wrs((h/s) log q + log rd)), and Ω(wrs log(dqh/s)) for d ≥ 2 and q ≥ 2; Θ(wrs log d) for d ≥ 2 and q = 1; Θ(wrs log(qh/s)) for d = 1 and q ≥ 2; O(wrs log(qh/s)) and Ω(wr log h) for d = 0 and q > 2; and O(wr log(rh)), O(wh) (for any r), Ω(wr) (for r ≤ h), and Ω(wh) (for any r) for d = 0 and q = 2.
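To make the growth rates above concrete, the following sketch (not from the paper; the function name and the example values w = 1000, s = 3, h = 100, q = 4, d = 2 are illustrative assumptions, and the constants hidden inside the O and Ω are unknown) evaluates the feedforward bound expressions for the case d ≥ 2 and q ≥ 2:

import math

def feedforward_vc_bounds(w, s, h, q, d):
    # Growth-rate proxies for the abstract's feedforward bounds in the
    # case d >= 2 and q >= 2 (hidden constants omitted).
    upper1 = w * s * (s * math.log(d) + math.log(q * h / s))  # O(ws(s log d + log(qh/s)))
    upper2 = w * s * ((h / s) * math.log(q) + math.log(d))    # O(ws((h/s) log q + log d))
    lower = w * s * math.log(d * q * h / s)                   # Omega(ws log(dqh/s))
    return upper1, upper2, lower

u1, u2, lo = feedforward_vc_bounds(w=1000, s=3, h=100, q=4, d=2)
print(f"upper1 ~ {u1:.0f}, upper2 ~ {u2:.0f}, lower ~ {lo:.0f}")

For these example values the first upper bound is considerably smaller than the second, illustrating that which of the two is tighter depends, roughly, on whether (h/s) log q exceeds s log d.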


Similar Resources

Vapnik-Chervonenkis Dimension of Recurrent Neural Networks

Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedforward networks. However, recurrent networks are also widely used in learning applications, in particular when time is a relevant parameter. This paper provides lower and upper bounds for the VC dimension of such networks. Several types of activation functions are discussed, including threshold, po...

Tight Bounds for the VC-Dimension of Piecewise Polynomial Networks

O(ws(s log d + log(dqh/s))) and O(ws((h/s) log q + log(dqh/s))) are upper bounds for the VC-dimension of a set of neural networks of units with piecewise polynomial activation functions, where s is the depth of the network, h is the number of hidden units, w is the number of adjustable parameters, q is the maximum number of polynomial segments of the activation function, and d is the max...

Nearly-tight VC-dimension bounds for piecewise linear neural networks

We prove new upper and lower bounds on the VC-dimension of deep neural networks with the ReLU activation function. These bounds are tight for almost the entire range of parameters. Letting W be the number of weights and L be the number of layers, we prove that the VC-dimension is O(WL log(W)), and provide examples with VC-dimension Ω(WL log(W/L)). This improves both the previously known upper ...
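As an aside (not part of the cited abstract), the gap between these two bounds can be checked numerically; in this sketch the function name and the example values W = 10^6 and L = 10 are assumptions:

import math

def relu_vc_bounds(W, L):
    # Growth-rate proxies for the stated bounds: upper O(WL log W),
    # lower Omega(WL log(W/L)); hidden constants omitted.
    return W * L * math.log(W), W * L * math.log(W / L)

upper, lower = relu_vc_bounds(1_000_000, 10)
print(f"upper ~ {upper:.2e}, lower ~ {lower:.2e}")  # differ by log(W)/log(W/L) ~ 1.2

The two expressions differ only by a factor of log(W)/log(W/L), which is why the bounds are described as tight for almost the entire range of parameters.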

Almost Linear VC Dimension Bounds for Piecewise Polynomial Networks

We compute upper and lower bounds on the VC dimension and pseudo-dimension of feedforward neural networks composed of piecewise polynomial activation functions. We show that if the number of layers is fixed, then the VC dimension and pseudo-dimension grow as W log W, where W is the number of parameters in the network. This result stands in contrast to the case where the number of layers is unbo...

Bounds for the Computational Power and Learning Complexity of Analog Neural Nets

It is shown that feedforward neural nets of constant depth with piecewise polynomial activation functions and arbitrary real weights can be simulated for boolean inputs and outputs by neural nets of a somewhat larger size and depth with Heaviside gates and weights from {0, 1}. This provides the first known upper bound for the computational power and VC-dimension of such neural nets. It is also sh...


Publication year: 2007